On the Complexity of Nash Equilibria in Anonymous Games
We show that the problem of finding an ε-approximate Nash equilibrium in an
anonymous game with seven pure strategies is PPAD-complete when the
approximation parameter ε is exponentially small in the number of players.
Unbounded Differentially Private Quantile and Maximum Estimation
In this work we consider the problem of differentially private computation of
quantiles for the data, especially the highest quantiles such as maximum, but
with an unbounded range for the dataset. We show that this can be done
efficiently through a simple invocation of a subroutine that is iteratively
called in the fundamental Sparse Vector Technique, even when there is no upper
bound on the data. In particular, we
show that this procedure can give more accurate and robust estimates on the
highest quantiles with applications towards clipping that is essential for
differentially private sum and mean estimation. In addition, we show how two
invocations can handle the fully unbounded data setting. Within our study, we
show that an improved analysis of this subroutine can improve the privacy
guarantees for the widely used Sparse Vector Technique, which is of independent
interest. We give a more general characterization of the privacy loss, which we
immediately apply to our method for improved privacy guarantees. Our algorithm
requires only one pass through the data, which can be unsorted, and answers
each subsequent query efficiently. We empirically compare our unbounded
algorithm with the state-of-the-art
algorithms in the bounded setting. For inner quantiles, we find that our method
often performs better on non-synthetic datasets. For the maximal quantiles,
which we apply to differentially private sum computation, we find that our
method performs significantly better.
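The noisy-threshold scan the abstract alludes to can be illustrated with a minimal Sparse-Vector-style sketch. This is illustrative only, not the paper's exact algorithm: the noise scales, the fixed step size, and the function name are all assumptions for the example.

```python
import random

def lap(scale, rng):
    # Laplace(scale) noise as the difference of two i.i.d. exponentials
    return rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)

def dp_quantile_unbounded(data, q, eps, step=1.0, seed=None):
    """Scan candidate values upward with no a-priori upper bound; stop when
    a noisy count of points <= candidate crosses a noisy target.
    SVT-style sketch with assumed noise scales; illustrative only."""
    rng = random.Random(seed)
    noisy_target = q * len(data) + lap(2.0 / eps, rng)
    b = 0.0
    while True:
        count = sum(1 for x in data if x <= b)  # one "query" per candidate
        if count + lap(4.0 / eps, rng) >= noisy_target:
            return b
        b += step
```

With data 0..99, q = 0.5, and a large eps (i.e. little noise), the scan stops near the true median.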
Nearly Tight Bounds for Sandpile Transience on the Grid
We use techniques from the theory of electrical networks to give nearly tight
bounds for the transience class of the Abelian sandpile model on the
two-dimensional grid up to polylogarithmic factors. The Abelian sandpile model
is a discrete process on graphs that is intimately related to the phenomenon of
self-organized criticality. In this process, vertices receive grains of sand,
and once the number of grains exceeds their degree, they topple by sending
grains to their neighbors. The transience class of a model is the maximum
number of grains that can be added to the system before it necessarily reaches
its steady-state behavior or, equivalently, a recurrent state. Through a more
refined and global analysis of electrical potentials and random walks, we give
nearly matching upper and lower bounds for the transience class of the grid.
Our methods naturally extend to higher-dimensional grids, where they give
corresponding upper and lower bounds.
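The toppling process described above is easy to simulate directly. A minimal sketch on an n x n grid, with the toppling threshold taken as the interior degree 4 and grains sent past the boundary simply lost (both standard conventions, assumed here):

```python
def topple(grid_size, grains, site):
    """Drop `grains` grains at `site` on an n x n sandpile; a cell topples
    whenever it holds at least 4 grains (the interior degree), sending one
    grain to each neighbor. Grains pushed off the boundary are lost.
    Returns the stabilized grid."""
    n = grid_size
    g = [[0] * n for _ in range(n)]
    g[site[0]][site[1]] = grains
    unstable = [site]
    while unstable:
        i, j = unstable.pop()
        while g[i][j] >= 4:
            g[i][j] -= 4
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < n and 0 <= nj < n:
                    g[ni][nj] += 1
                    if g[ni][nj] >= 4:
                        unstable.append((ni, nj))
    return g
```

For example, 16 grains dropped at the center of a 5 x 5 grid stabilize without reaching the boundary, so no grains are lost.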
Green Infrastructure as a Campus Storm Water Management Technique
The primary impact of urbanization to water resources is the increase in impervious surfaces from buildings, parking lots, and transportation corridors. This hardening of an urban watershed can dramatically increase runoff, creating more extreme and more frequent flood events, as well as reducing recharge to groundwater and summer base flows. Urbanization also results in an increase in the types and severity of pollutants. Associated with modified flows is an increase in concentrations and total loads of pollutants, and a decrease in the watershed’s natural ability to assimilate these pollutants.
Percent ISA (impervious surface area) in small urban watersheds has been suggested as a predictor of cumulative impacts to water quality resulting from urbanization (Chester and Gibbons, 1996). Cumulative ISA greater than 10% appears to put water resources at risk (Mesner et al., 2015; Arentsen et al., 2004; Brabec et al., 2002), while watersheds with greater than 25% ISA often have impacted water bodies. Best management practices (BMPs) can mitigate these impacts. BMPs include landscape features such as grassy swales and retention and detention basins, designed to collect and increase infiltration of runoff from parking lots, new subdivisions, and other areas of concentrated ISA (Jia et al., 2012).
This research specifically explores how green roofs provide a similar benefit. Precipitation soaks into specially designed vegetated areas on roofs, slowing and reducing runoff. Green roofs provide the added benefit of reducing rooftop temperatures, thus mitigating the “heat island” effect seen in many urban areas.
The project also includes research on storm water management at USU Logan's main campus through a green infrastructure master plan and educational outreach implementation.
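A common back-of-the-envelope way to quantify the runoff reduction described above is the Rational Method, Q = C·i·A. The runoff coefficients and storm values below are assumed illustrative numbers, not measured figures from this project:

```python
def peak_runoff_m3_per_s(c, intensity_mm_per_hr, area_m2):
    """Rational Method Q = C * i * A, converting i from mm/hr to m/s
    so that Q comes out in m^3/s."""
    return c * (intensity_mm_per_hr / 1000.0 / 3600.0) * area_m2

# Assumed illustrative coefficients: ~0.9 for a conventional roof,
# ~0.4 for a vegetated (green) roof; a 25 mm/hr storm on a 2000 m^2 roof.
conventional = peak_runoff_m3_per_s(0.90, 25.0, 2000.0)
green = peak_runoff_m3_per_s(0.40, 25.0, 2000.0)
```

Under these assumptions the green roof cuts the peak runoff by more than half, which is the qualitative effect the paragraph describes.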
Sampling Random Spanning Trees Faster than Matrix Multiplication
We present an algorithm that, with high probability, generates a random
spanning tree from an edge-weighted undirected graph in time faster than
matrix multiplication. The tree is sampled from a distribution
where the probability of each tree is proportional to the product of its edge
weights. This improves upon the previous best algorithm due to Colbourn et al.
that runs in matrix multiplication time. For the special case of unweighted
graphs, this improves upon the best previously known running times (Colbourn
et al. '96, Kelner-Madry '09, Madry et al. '15).
The effective resistance metric is essential to our algorithm, as in the work
of Madry et al., but we eschew determinant-based and random walk-based
techniques used by previous algorithms. Instead, our algorithm is based on
Gaussian elimination, and the fact that effective resistance is preserved in
the graph resulting from eliminating a subset of vertices (called a Schur
complement). As part of our algorithm, we show how to compute approximate
effective resistances for a set of vertex pairs via approximate Schur
complements, without using the Johnson-Lindenstrauss lemma. We combine this
approximation procedure with an error-correction procedure for handling edges
where our estimate is not sufficiently accurate.
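The key fact the abstract relies on, that effective resistance is preserved when vertices are eliminated into a Schur complement, can be checked on a toy example. This sketch (not the paper's algorithm) performs one step of Gaussian elimination on a graph Laplacian:

```python
def schur_eliminate(L, k):
    """Eliminate vertex k from Laplacian L (list of lists) via one step of
    Gaussian elimination; the result is the Schur complement, itself a
    Laplacian on the remaining vertices."""
    keep = [i for i in range(len(L)) if i != k]
    return [[L[i][j] - L[i][k] * L[k][j] / L[k][k] for j in keep]
            for i in keep]

# Triangle with unit edge weights, vertices {0, 1, 2}
L = [[ 2.0, -1.0, -1.0],
     [-1.0,  2.0, -1.0],
     [-1.0, -1.0,  2.0]]

S = schur_eliminate(L, 2)     # eliminate vertex 2 -> Laplacian of one edge {0,1}
r_eff = -1.0 / S[0][1]        # edge weight is -S[0][1]; resistance = 1/weight
```

The eliminated graph is a single edge of weight 1.5, so r_eff = 2/3, matching the direct series-parallel calculation 1 ∥ (1 + 1) = 2/3 on the original triangle.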
Fully Dynamic Effective Resistances
In this paper we consider the \emph{fully-dynamic} All-Pairs Effective
Resistance problem, where the goal is to maintain effective resistances on a
graph among any pair of query vertices under an intermixed sequence of edge
insertions and deletions. The effective resistance between a pair of
vertices is a physics-motivated quantity that encapsulates both the congestion
and the dilation of a flow. It is directly related to random walks, and it has
been instrumental in the recent works for designing fast algorithms for
combinatorial optimization problems, graph sparsification, and network science.
We give a data structure that maintains approximations to all-pairs effective
resistances of a fully dynamic unweighted, undirected multigraph with
sublinear expected amortized update and query time, against an oblivious
adversary. Key to our result is the
maintenance of a dynamic \emph{Schur complement}~(also known as vertex
resistance sparsifier) onto a set of terminal vertices of our choice.
This maintenance is obtained (1) by interpreting the Schur complement as a
sum of random walks and (2) by randomly picking the vertex subset into which
the sparsifier is constructed. We can then show that each update in the graph
affects a small number of such walks, which in turn leads to our sub-linear
update time. We believe that this local representation of vertex sparsifiers
may be of independent interest.
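The random-walk view behind the Schur-complement maintenance can be illustrated with the classical commute-time identity C(u, v) = 2m · R_eff(u, v), a standard fact relating walks to effective resistance (this is background, not the paper's data structure):

```python
import random

def commute_time_estimate(adj, u, v, trials, seed=0):
    """Monte Carlo estimate of the expected commute time u -> v -> u on an
    unweighted graph given as an adjacency dict. The identity
    C(u, v) = 2 * m * R_eff(u, v) ties this to effective resistance."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        cur, steps, target = u, 0, v
        while True:
            cur = rng.choice(adj[cur])
            steps += 1
            if cur == target:
                if target == u:
                    break           # returned to u: commute complete
                target = u          # reached v: now walk back to u
        total += steps
    return total / trials

# Triangle with unit weights: R_eff(0, 1) = 2/3 and m = 3, so C(0, 1) = 4
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
```

On the triangle the simulated commute time concentrates around 4, as the identity predicts.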